---
title: Snowflake prediction job examples
description: Configure prediction jobs with Snowflake connections.
---

# Snowflake prediction job examples {: #snowflake-prediction-job-examples }

There are two ways to set up a batch prediction job definition for Snowflake:

* Using a [JDBC connector with Snowflake](#jdbc-with-snowflake) as an external data source.
* Using the [Snowflake adapter](#snowflake-with-an-external-stage) with an [external stage](glossary/index#external-stage).

??? tip "Which connection method should I use for Snowflake?"
    Transferring data over JDBC can consume significant IOPS (input/output operations per second) and drive up costs for data warehouses. The Snowflake adapter reduces the load on the database engine during prediction scoring by combining cloud storage with bulk inserts, creating a hybrid JDBC-cloud storage solution.

## JDBC with Snowflake {: #jdbc-with-snowflake }

To complete these examples, follow the steps in [Create a prediction job definition](batch-pred-jobs#create-a-prediction-job-definition), using the following procedures to configure JDBC with Snowflake as your prediction source and destination.

### Configure JDBC with Snowflake as source {: #configure-jdbc-with-snowflake-as-source }

!!! tip
    See [Prediction intake options](intake-options#jdbc-scoring) for field descriptions.

1. For **Prediction source**, select **JDBC** as the **Source type** and click **+ Select connection**.

    ![](images/job-source-jdbc-snowflake-define-conn.png)

2. Select a previously added JDBC Snowflake [connection](data-conn).

    ![](images/job-source-dest-jdbc-snowflake-add-data-conn.png)

3. Select your Snowflake account.

    ![](images/job-source-dest-jdbc-snowflake-account.png)

4. Select your Snowflake schema.

    ![](images/job-source-dest-jdbc-snowflake-schema.png)

5. Select the table you want scored and click **Save connection**.

    ![](images/job-source-jdbc-snowflake-table.png)

6. Continue setting up the rest of the job definition. [Schedule](batch-pred-jobs#schedule-prediction-jobs) and save the definition. You can also run it immediately for testing. [Manage your jobs](batch-pred-jobs#manage-prediction-jobs) on the **Prediction Jobs** tab.
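The source configuration above can also be expressed programmatically. The following is a minimal sketch based on the DataRobot Python client's batch prediction interface; all IDs and names are placeholders, and exact field names may differ in your client version:

```python
# Sketch of a JDBC (Snowflake) intake configuration for a batch
# prediction job. All IDs below are hypothetical placeholders.
jdbc_intake_settings = {
    "type": "jdbc",
    "data_store_id": "your-snowflake-data-store-id",  # the saved JDBC connection
    "credential_id": "your-credential-id",            # stored Snowflake credentials
    "schema": "PUBLIC",                               # schema selected in step 4
    "table": "SCORING_DATA",                          # table selected in step 5
}

# With the datarobot client installed and configured, the job could be
# submitted roughly like this (commented out so the sketch is self-contained):
# import datarobot as dr
# job = dr.BatchPredictionJob.score(
#     deployment="your-deployment-id",
#     intake_settings=jdbc_intake_settings,
#     output_settings=...,  # see the destination example below
# )
print(jdbc_intake_settings["type"])
```

The same dictionary shape is used whether the job is created through the UI, the REST API, or the Python client, which makes it easy to move a tested definition into scheduled automation.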


### Configure JDBC with Snowflake as destination {: #configure-jdbc-with-snowflake-as-destination }

!!! tip
    See [Prediction output options](output-options#jdbc-write) for field descriptions.

1. For **Prediction destination**, select **JDBC** as the **Destination type** and click **+ Select connection**.

    ![](images/job-dest-jdbc-snowflake-define-conn.png)

2. Select a previously added JDBC Snowflake [connection](data-conn).

    ![](images/job-source-dest-jdbc-snowflake-add-data-conn.png)

3. Select your Snowflake account.

    ![](images/job-source-dest-jdbc-snowflake-account.png)

4. Select the schema you want to write the predictions to.

    ![](images/job-source-dest-jdbc-snowflake-schema.png)

5. Select a table or create a new table. If you create a new table, DataRobot creates it with the proper features and assigns the correct data type to each feature.

    ![](images/job-dest-jdbc-snowflake-table.png)   

6. Enter the table name and click **Save connection**.

    ![](images/job-dest-jdbc-snowflake-save-conn.png)

7. Select the **Write strategy**. In this case, **Insert** is selected because the table is new.

    ![](images/job-dest-jdbc-snowflake-write-strategy.png)     

8. Continue setting up the rest of the job definition. [Schedule](batch-pred-jobs#schedule-prediction-jobs) and save the definition. You can also run it immediately for testing. [Manage your jobs](batch-pred-jobs#manage-prediction-jobs) on the **Prediction Jobs** tab.  
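The destination configuration can be sketched the same way. The snippet below assumes the DataRobot Python client's batch prediction settings shape; the IDs, table name, and the `statement_type` field name are illustrative and may differ by client version:

```python
# Sketch of a JDBC (Snowflake) output configuration for a batch
# prediction job. All IDs below are hypothetical placeholders.
jdbc_output_settings = {
    "type": "jdbc",
    "data_store_id": "your-snowflake-data-store-id",  # the saved JDBC connection
    "credential_id": "your-credential-id",            # stored Snowflake credentials
    "schema": "PUBLIC",                               # schema selected in step 4
    "table": "PREDICTION_RESULTS",                    # table named in step 6
    "statement_type": "insert",                       # write strategy from step 7
}
print(jdbc_output_settings["statement_type"])
```

Choosing `insert` matches the walkthrough above, where the table is new; an existing table with a primary key would typically call for an update-style strategy instead.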


## Snowflake with an external stage {: #snowflake-with-an-external-stage }

Before using the Snowflake adapter for job definitions, you need to:

* Set up the Snowflake [connection](data-conn#create-a-new-connection).

* Create an external stage for Snowflake, a cloud storage location used for loading and unloading data. You can create an [Amazon S3 stage](https://docs.snowflake.com/en/user-guide/data-load-s3-create-stage.html){ target=_blank } or a [Microsoft Azure stage](https://docs.snowflake.com/en/user-guide/data-load-azure-create-stage.html){ target=_blank }. You will need your account and authentication keys.

To complete these examples, follow the steps in [Create a prediction job definition](batch-pred-jobs#create-a-prediction-job-definition), using the following procedures to configure Snowflake as your prediction source and destination.


### Configure Snowflake with an external stage as source {: #configure-snowflake-with-an-external-stage-as-source }

!!! tip
    See [Prediction intake options](intake-options#snowflake-scoring) for field descriptions.


1. For **Prediction source**, select **Snowflake** as the **Source type** and click **+ Select connection**.

    ![](images/job-source-snowflake-define-conn.png)

2. Select a previously added Snowflake [connection](data-conn).

    ![](images/job-source-dest-snowflake-add-data-conn.png)

3. Select your Snowflake account.

    ![](images/job-source-dest-snowflake-account.png)

4. Select your Snowflake schema.

    ![](images/job-source-dest-snowflake-schema.png)

5. Select the table you want scored and click **Save connection**.

    ![](images/job-source-dest-snowflake-table.png)

6. Toggle on **Use external stage** and select your **Cloud storage type** (Azure or S3).

    ![](images/job-source-snowflake-cloud-storage-type.png)

7. Enter the **External stage** you created for your Snowflake account. Enable **This external stage requires credentials** and click **+ Add credentials**.

    ![](images/job-source-snowflake-external-stage.png)

8. Select your credentials.

    ![](images/job-source-dest-snowflake-s3-credentials.png)

    The completed **Prediction source** section looks like the following:

    ![](images/job-final-snowflake-source.png)

9. Continue setting up the rest of the job definition. [Schedule](batch-pred-jobs#schedule-prediction-jobs) and save the definition. You can also run it immediately for testing. [Manage your jobs](batch-pred-jobs#manage-prediction-jobs) on the **Prediction Jobs** tab.
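For reference, the external-stage source configuration above maps to a settings dictionary like the following sketch. It assumes the DataRobot Python client's Snowflake adapter settings; the stage name, IDs, and exact field names are placeholders:

```python
# Sketch of a Snowflake adapter intake configuration using an
# external stage. All IDs and the stage name are hypothetical.
snowflake_intake_settings = {
    "type": "snowflake",
    "data_store_id": "your-snowflake-data-store-id",    # the saved connection
    "credential_id": "your-snowflake-credential-id",    # Snowflake credentials
    "schema": "PUBLIC",                                 # schema from step 4
    "table": "SCORING_DATA",                            # table from step 5
    "external_stage": "my_s3_stage",                    # stage from step 7
    "cloud_storage_type": "s3",                         # or "azure" (step 6)
    "cloud_storage_credential_id": "your-s3-credential-id",  # step 8
}
print(snowflake_intake_settings["external_stage"])
```

Note the two credential IDs: one authenticates to Snowflake itself, the other to the cloud storage backing the stage, mirroring steps 7 and 8 above.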


### Configure Snowflake with an external stage as destination {: #configure-snowflake-with-an-external-stage-as-destination }

!!! tip
    See [Prediction output options](output-options#snowflake-write) for field descriptions.

1. For **Prediction destination**, select **Snowflake** as the **Destination type** and click **+ Select connection**.

    ![](images/job-dest-snowflake-define-conn.png)

2. Select a previously added Snowflake [connection](data-conn).

    ![](images/job-source-dest-snowflake-add-data-conn.png)

3. Select your Snowflake account.

    ![](images/job-source-dest-snowflake-account.png)

4. Select your Snowflake schema.

    ![](images/job-source-dest-snowflake-schema.png)

5. Select a table or create a new table. If you create a new table, DataRobot creates it with the proper features and assigns the correct data type to each feature.

    ![](images/job-dest-snowflake-table.png)   

6. Enter the table name and click **Save connection**.

    ![](images/job-dest-snowflake-save-conn.png)

7. Toggle on **Use external stage** and select your **Cloud storage type** (Azure or S3).

    ![](images/job-dest-snowflake-cloud-storage-type.png)

8. Enter the **External stage** you created for your Snowflake account. Enable **This external stage requires credentials** and click **+ Add credentials**.

    ![](images/job-dest-snowflake-external-stage.png)

9. Select your credentials.

    ![](images/job-source-dest-snowflake-s3-credentials.png)

    The completed **Prediction destination** section looks like the following:

    ![](images/job-final-snowflake-dest.png)

10. Continue setting up the rest of the job definition. [Schedule](batch-pred-jobs#schedule-prediction-jobs) and save the definition. You can also run it immediately for testing. [Manage your jobs](batch-pred-jobs#manage-prediction-jobs) on the **Prediction Jobs** tab.
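The destination side of the Snowflake adapter follows the same pattern. Below is a minimal sketch assuming the DataRobot Python client's settings shape; all IDs, names, and the `statement_type` field are illustrative placeholders:

```python
# Sketch of a Snowflake adapter output configuration using an
# external stage. All IDs and the stage name are hypothetical.
snowflake_output_settings = {
    "type": "snowflake",
    "data_store_id": "your-snowflake-data-store-id",    # the saved connection
    "credential_id": "your-snowflake-credential-id",    # Snowflake credentials
    "schema": "PUBLIC",                                 # schema from step 4
    "table": "PREDICTION_RESULTS",                      # table from steps 5-6
    "statement_type": "insert",                         # write strategy
    "external_stage": "my_s3_stage",                    # stage from step 8
    "cloud_storage_type": "s3",                         # or "azure" (step 7)
    "cloud_storage_credential_id": "your-s3-credential-id",  # step 9
}
print(snowflake_output_settings["table"])
```

Because the adapter writes results to the stage and bulk-loads them into the table, this configuration keeps the database engine's load far lower than a row-by-row JDBC insert, which is the rationale given in the tip at the top of this page.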
